ε-SSVR: A Smooth Support Vector Machine for ε-insensitive Regression

Authors

  • Yuh-Jye Lee
  • Wen-Feng Hsieh
  • Chien-Ming Huang
Abstract

A new smoothing strategy for solving ε-support vector regression (ε-SVR), tolerating a small error in fitting a given dataset linearly or nonlinearly, is proposed in this paper. Conventionally, ε-SVR is formulated as a constrained minimization problem, namely a convex quadratic programming problem. We apply the smoothing techniques that have been used for solving the support vector machine for classification to replace the ε-insensitive loss function by an accurate smooth approximation. This allows us to solve ε-SVR directly as an unconstrained minimization problem. We term this reformulated problem ε-smooth support vector regression (ε-SSVR). We also prescribe a Newton-Armijo algorithm, which has been shown to converge globally and quadratically in finitely many steps, to solve our ε-SSVR. In order to handle the case of nonlinear ...

Department of Computer Science & Information Engineering, National Taiwan University of Science and Technology, Taipei, Taiwan 106, [email protected]. Department of Computer Science & Information Engineering, National Chung Cheng University, Chia-Yi, Taiwan 621, [email protected]. ‡ Department of Computer Science & Information Engineering, National Taiwan University of Science and Technology, Taipei, Taiwan 106, [email protected].
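The core reformulation step described in the abstract, replacing the nondifferentiable ε-insensitive loss max(|x| - ε, 0) with a smooth surrogate, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it uses the standard smooth plus function p(x, α) = x + (1/α)·log(1 + exp(-αx)) from the smooth-SVM literature to approximate each one-sided hinge, with ε and the smoothing parameter α chosen arbitrarily here.

```python
import numpy as np

def p(x, alpha):
    """Smooth plus function: a differentiable approximation of max(x, 0).
    p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha * x))."""
    # np.logaddexp(0, t) computes log(1 + exp(t)) without overflow
    return x + np.logaddexp(0.0, -alpha * x) / alpha

def eps_insensitive(x, eps):
    """Exact epsilon-insensitive loss: max(|x| - eps, 0)."""
    return np.maximum(np.abs(x) - eps, 0.0)

def smooth_eps_insensitive(x, eps, alpha):
    """Smooth surrogate: one smoothed hinge for each side of the tube."""
    return p(x - eps, alpha) + p(-x - eps, alpha)

residuals = np.linspace(-2.0, 2.0, 5)
exact = eps_insensitive(residuals, eps=0.5)
smooth = smooth_eps_insensitive(residuals, eps=0.5, alpha=20.0)
# For moderate alpha the surrogate tracks the exact loss closely,
# while remaining smooth, so Newton-type methods apply
print(np.max(np.abs(exact - smooth)))
```

The approximation error shrinks as α grows, which is what makes an unconstrained smooth reformulation solvable by a fast Newton-Armijo iteration instead of a constrained quadratic program.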


Similar articles

Study of ε-Smooth Support Vector Regression and Comparison with ε-Support Vector Regression and Potential Support Vector Machines for Prediction of the Antitubercular Activity of Oxazolines and Oxazoles Derivatives

A new smoothing method for solving ε-support vector regression (ε-SVR), tolerating a small error in fitting a given dataset nonlinearly, is proposed in this study. It is a smooth unconstrained optimization reformulation of the traditional linear programming problem associated with ε-insensitive support vector regression. We term this redeveloped problem ε-smooth support vector regression (ε-S...


Selecting a Reduced Set for Building Sparse Support Vector Regression in the Primal

Recent work shows that support vector machines (SVMs) can be solved efficiently in the primal. This paper follows this line of research and shows how to build sparse support vector regression (SVR) in the primal, providing a scalable, sparse support vector regression algorithm named SSVR-SRS. Empirical comparisons show that the number of basis functions required by the proposed algor...


Application of Support Vector Machine Regression for Predicting Critical Responses of Flexible Pavements

This paper aims to assess the application of Support Vector Machine (SVM) regression to the analysis of flexible pavements. To this end, 10,000 four-layer flexible pavement sections, consisting of an asphalt concrete layer, a granular base layer, a granular subbase layer, and subgrade soil, were analyzed under standard axle loading using multi-layered elastic theory, and pavement critical r...


Generalized recurrent neural network for ϵ-insensitive support vector regression

In this paper, a generalized recurrent neural network is proposed for solving ε-insensitive support vector regression (ε-ISVR). The ε-ISVR is first formulated as a convex non-smooth programming problem, and then a generalized recurrent neural network with lower model complexity is designed for training the support vector machine. Furthermore, simulation results are given to demonstrate the effecti...


Support vector regression with random output variable and probabilistic constraints

Support Vector Regression (SVR) solves regression problems based on the concept of the Support Vector Machine (SVM). In this paper, a new model of SVR with probabilistic constraints is proposed, in which the output data and the bias are treated as random variables with uniform probability functions. Using the proposed method, the optimal regression hyperplane can be obtained by solving a quadrati...



Journal:

Volume   Issue 

Pages  -

Publication date: 2004